Provable approximation properties for deep neural networks

Authors

  • Uri Shaham
  • Alexander Cloninger
  • Ronald R. Coifman
Abstract

We discuss approximation of functions using deep neural nets. Given a function f on a d-dimensional manifold Γ ⊂ R^m, we construct a sparsely-connected depth-4 neural network and bound its error in approximating f. The size of the network depends on the dimension and curvature of the manifold Γ and on the complexity of f in terms of its wavelet description, and only weakly on the ambient dimension m. Essentially, our network computes wavelet functions, which are themselves composed of Rectified Linear Units (ReLUs).
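To make the last sentence of the abstract concrete, the following is a minimal sketch (in NumPy) of how a localized bump function can be expressed with ReLUs: a one-dimensional trapezoid is written as a sum of four ReLUs, and a d-dimensional bump is obtained by applying one more ReLU to the sum of coordinate-wise trapezoids. The function names and the omitted normalizing constant are illustrative, not the paper's exact notation; the full wavelet frame in the paper is built from dilations, translations, and differences of such bumps.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit."""
    return np.maximum(0.0, x)

def trapezoid(x):
    """1-D trapezoid built from four ReLUs: equals 2 on [-1, 1]
    and decays linearly to 0 outside [-3, 3]."""
    return relu(x + 3) - relu(x + 1) - relu(x - 1) + relu(x - 3)

def bump(x):
    """Sketch of a d-dimensional localized bump: sum the 1-D trapezoids
    along each coordinate and pass the result through one more ReLU.
    `x` has shape (d,); the normalizing constant is omitted."""
    d = x.shape[0]
    return relu(np.sum(trapezoid(x)) - 2 * (d - 1))

# The bump is positive near the origin and vanishes far from it.
print(bump(np.zeros(3)))      # > 0
print(bump(np.full(3, 10.)))  # 0.0
```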

Similar articles

A Dual Approach to Scalable Verification of Deep Networks

This paper addresses the problem of formally verifying desirable properties of neural networks, i.e., obtaining provable guarantees that the outputs of the neural network will always behave in a certain way for a given class of inputs. Most previous work on this topic was limited in its applicability by the size of the network, network architecture and the complexity of properties to be verifie...


Posterior Concentration for Sparse Deep Learning

Spike-and-Slab Deep Learning (SS-DL) is a fully Bayesian alternative to Dropout for improving generalizability of deep ReLU networks. This new type of regularization enables provable recovery of smooth input-output maps with unknown levels of smoothness. Indeed, we show that the posterior distribution concentrates at the near minimax rate for α-Hölder smooth maps, performing as well as if we kn...


Porosity classification from thin sections using image analysis and neural networks including shallow and deep learning in Jahrum formation

The porosity within a reservoir rock is a basic parameter for the reservoir characterization. The present paper introduces two intelligent models for identification of the porosity types using image analysis. For this aim, firstly, thirteen geometrical parameters of pores of each image were extracted using the image analysis techniques. The extracted features and their corresponding pore types ...


Optimal Approximation with Sparsely Connected Deep Neural Networks

We derive fundamental lower bounds on the connectivity and the memory requirements of deep neural networks guaranteeing uniform approximation rates for arbitrary function classes in L^2(R^d). In other words, we establish a connection between the complexity of a function class and the complexity of deep neural networks approximating functions from this class to within a prescribed accuracy. Additio...


Provable Methods for Training Neural Networks with Sparse Connectivity

We provide novel guaranteed approaches for training feedforward neural networks with sparse connectivity. We leverage on the techniques developed previously for learning linear networks and show that they can also be effectively adopted to learn non-linear networks. We operate on the moments involving label and the score function of the input, and show that their factorization provably yields t...



Journal:
  • CoRR

Volume: abs/1509.07385

Publication date: 2015